28 - Diagnostic Medical Image Processing (DMIP) [ID:2099]

The following content has been provided by the University of Erlangen-Nürnberg.

So welcome to the Monday late afternoon session. For the last week of this semester, I will first present a few practical issues related to mutual information based registration approaches. This is actually a scientific talk I gave about two and a half years ago, and it tells you what we are doing in research, or what we did two years ago, with respect to normalized mutual information and mutual information in general. You will be surprised what type of questions still arise in this context, and I will show you how we have tackled them. Tomorrow I will summarize the core topics that we have covered in the lecture, except reconstruction, where I only partially know what exactly Andreas told you; but I have the slides and I will be well prepared for the oral exam, as you can imagine. Okay.

So please remember what mutual information based image registration is about. Basically, you should keep in mind one important formula, and this formula is the Kullback-Leibler divergence of the joint versus the product probabilities. We sum over the values of two random variables and consider the joint density, that is, the probability that we have observed both values together, times the logarithm of P(A, B) divided by P(A) times P(B). This is a similarity measure that compares the joint density with the product density. If these two probabilities are the same, then we have statistically independent random variables, and this measure goes down to zero.
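For reference, writing the joint density as p(a, b) and the marginals as p(a) and p(b) (my own shorthand for the quantities just described), the formula is

    MI(A, B) = \sum_{a} \sum_{b} p(a, b) \log( p(a, b) / ( p(a) p(b) ) ),

which is exactly the Kullback-Leibler divergence between p(a, b) and the product p(a) p(b); it is zero precisely when p(a, b) = p(a) p(b) for all pairs, i.e. when A and B are statistically independent.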

Now think about a concrete example: suppose A takes values in {1, ..., N} and B takes values in {1, ..., M}, so we have N different values for A and M different values for B. It is important to see that A and B are not required to come from the same domain; they can take different values, and the values can come from different sets of possible values. So how many values do I have to store in my computer, in my medical imaging device, to have a proper representation of the joint density? If I want to store P(A, B), then I have the A values along one axis and the B values along the other, and I get a table of relative frequencies whose entries are the probabilities P(A = 1, B = 1), P(A = 2, B = 1), and so on. So how many values do I have to store? I get a matrix with N times M entries.
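To picture the table the lecturer is describing here, a small sketch with N = 3 values for A and M = 4 values for B (the layout is illustrative, not taken from the slides):

              B = 1     B = 2     B = 3     B = 4
    A = 1    P(1, 1)   P(1, 2)   P(1, 3)   P(1, 4)
    A = 2    P(2, 1)   P(2, 2)   P(2, 3)   P(2, 4)
    A = 3    P(3, 1)   P(3, 2)   P(3, 3)   P(3, 4)

That is N × M = 12 entries, one relative frequency per pair (a, b).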

So think about the following problem: we want to estimate the joint density from observations. We need to count how often we see the pair (2, 1) compared to the total number of pairs we have observed, how often we see the pair (2, 3), and so on, and then put the relative frequency into the corresponding entry of the table. And you can imagine that you need way more observations than just N times M to estimate these values reliably; the number of entries is basically quadratic in the number of possible values of A and B.
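A minimal sketch in Python of this counting procedure, assuming the observations are already given as paired discrete values (the variable names and the toy data are my own, not taken from the lecture):

    import numpy as np

    # Toy observations: paired samples (a_i, b_i) with a in {1..N}, b in {1..M}.
    # In a registration setting these would be quantized intensity pairs of
    # corresponding pixels in the two images.
    a = np.array([1, 2, 2, 3, 1, 2, 3, 3, 1, 2])
    b = np.array([1, 1, 3, 2, 1, 3, 2, 4, 1, 3])

    N, M = 3, 4  # number of possible values of A and of B

    # Joint histogram: count how often each pair (a, b) occurs ...
    counts = np.zeros((N, M))
    for ai, bi in zip(a, b):
        counts[ai - 1, bi - 1] += 1

    # ... and divide by the total number of observations to get the table of
    # relative frequencies, i.e. the estimate of P(A, B).
    p_ab = counts / counts.sum()

    print(p_ab.shape)  # (N, M): N * M entries have to be estimated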

If you look at the estimation of P(A), how many entries are you required to estimate? Roman? N. Right, here you just estimate a histogram with N entries, and for P(B) I have M entries. So if I want to build the joint histogram, I need N times M elements; if I want to compute the product probability, I only have to estimate N plus M entries. The product of the table dimensions turns out to be just a sum in the independence case. And this is one reason why pattern recognition and image processing people quite often make use of independence assumptions: if you have no chance to get a good estimate of the joint histogram, you approximate it by the product of the marginals, which is of course wrong in this context, but it shows you how the dimensions behave. For the product density you estimate N plus M probabilities, and for the joint density you have to estimate N times M probabilities. You see the difference? No? At least I told you the difference.

So basically, what we are heavily considering is the processing of images like this one. This is an X-ray image of the heart; here you see the ribs, here certain parts of the thorax, and here a different view of the heart. And what we want to do is registration for these images. And there you don't

Accessible via: Open access

Duration: 00:41:30 min

Recording date: 2012-02-06

Uploaded on: 2012-02-07 12:27:34

Language: en-US
